Patent abstract:
MULTI-LANGUAGE SEARCH OPTIONS. The present invention relates to methods, systems, and apparatus, including computer programs encoded on computer storage media, for multi-language image search. One method includes receiving an image search query and image search results responsive to the image search query. Translations of the image search query into other languages are obtained, and search results responsive to each of the translations are received. Instructions are provided to a client device. The instructions cause the client device to present a user interface that includes one or more of the image search results and a multi-language search option for each of the translations. Each multi-language search option includes the translation and a preview of the image search results responsive to that translation.
Publication number: BR112012012133B1
Application number: R112012012133-2
Filing date: 2009-11-20
Publication date: 2021-01-12
Inventors: Zhuohao Wu; Hui Li; Gangjiang Li; Yong Zhang; Guang Hua Li; Boon-Lock Yeo
Applicant: Google Llc
IPC main classification:
Patent description:

[0001] This specification relates to image search. Internet search engines provide information about resources accessible over the Internet (for example, web pages, images, text documents, multimedia content) that are responsive to a user's search query. For example, when a user submits an image search query, that is, one or more terms that an Internet search engine uses to search for images, the Internet search engine can generate a group of image search results responsive to the image search query and present them to the user. However, for certain queries, translations of the queries into particular languages have better results (for example, more relevant results, more varied results, more numerous results, etc.). For example, if the query is the name of an icon of popular Japanese culture, search queries in Japanese are likely to have better results than search queries in English. Therefore, for a user entering an image search query in a given language for a particular subject, the image search results may not be as good as they would have been if he or she had entered the same search query for the particular subject in a different language.
Summary
[0002] This specification describes technologies related to multi-language image search.
[0003] In general, one aspect of the subject matter described in this specification can be embodied in methods that include the actions of receiving, in a data processing apparatus, a first image search query and first image search results that are responsive to the first image search query, where the first image search query is one or more terms in a first language; obtaining, by the data processing apparatus, translations of the first image search query, where each translation is a translation of the first image search query into a respective second language other than the first language; receiving, in the data processing apparatus, for each translation of the first image search query, respective image search results that are determined to be responsive to the translation of the first image search query when the translation is used as an image search query; and providing first instructions to a client device that, when executed by the client device, cause the client device to present a user interface that includes: one or more of the first image search results responsive to the first image search query; and a respective multi-language search option for each of the translations of the first image search query, the respective multi-language search option for each translation including the translation and a preview of the respective image search results responsive to the translation, where each multi-language search option is selectable in the user interface. Other embodiments of this aspect include corresponding systems, apparatus, and computer program products recorded on computer storage devices, each configured to perform the operations of the methods.
[0004] These and other embodiments can each optionally include one or more of the following features. The preview of the respective image search results responsive to a respective translation of the image search query can be an image associated with one of the respective image search results.
[0005] The method may additionally include, in response to a selection of a first one of the multi-language search options, providing second instructions to the client device that, when executed by the client device, cause the client device to present a user interface that includes a first translation that corresponds to the first multi-language search option and the respective image search results that are responsive to the first translation. The second instructions can additionally include instructions that, when executed by the client device, cause the client device to display the first image search query. The first image search query can be selectable in the user interface. The method may additionally include, in response to a selection of the first image search query, providing third instructions to the client device that, when executed by the client device, cause the client device to present a user interface that includes the first image search results.
[0006] Obtaining one or more selected translations of the first image search query can include receiving a plurality of candidate translations of the first image search query; determining a score for each candidate translation; and selecting the translations from the candidate translations according to the scores. Determining a score for a candidate translation can include determining the score for the candidate translation from a submission frequency measurement that measures how often the candidate translation is received from users as an image search query. Determining a score for a candidate translation can include determining the score for the candidate translation from a revision frequency measurement that measures how often search queries in the first language are revised by users into corresponding search queries in the respective second language of the candidate translation. Determining a score for a candidate translation can include determining the score for the candidate translation from a percentage of clicks for the candidate translation when the candidate translation is submitted as a search query, where the percentage of clicks measures how often users select search results responsive to the candidate translation. Determining a score for a candidate translation can include determining the score for the candidate translation from a unique users measurement that estimates the number of unique users who submitted the candidate translation as a search query. Determining a score for a candidate translation can include determining the score for the candidate translation from a result quantity measurement that measures a quantity of image search results responsive to the candidate translation when the candidate translation is submitted as an image search query.
[0007] Particular embodiments of the subject matter described in this specification can be implemented to realize one or more of the following advantages. Additional search options can be presented to a user that allow the user to identify additional relevant image search results. The additional search options may be in languages with which the user is not necessarily familiar. A user can view a preview of the search results for each search option to estimate how useful each option may be. Other indications of how useful a search option may be can also be presented to a user, for example, the number of results available for that search option and the name of the language of the search option.
[0008] The details of one or more embodiments of the subject matter described in this specification are set forth in the drawings and the description below. Other features, aspects, and advantages of the subject matter will be apparent from the description, the drawings, and the claims.
Brief Description of the Figures
[0009] Figure 1 is a block diagram of an example environment in which an image search system provides image search services. Figure 2A illustrates an example search user interface that presents multi-language search options along with search results responsive to a query. Figure 2B illustrates an example image search interface that presents image search results responsive to a translation of the query. Figure 3 is a block diagram that illustrates an example image search system architecture. Figure 4 is a block diagram that illustrates an example translation engine. Figure 5 is a flow chart of an example user interface generation process for image search.
[0010] Like reference numbers and designations in the various drawings indicate like elements.
Detailed Description
§ 1.0 Example Search Environment
[0011] Figure 1 is a block diagram of an example environment 100 in which an image search system 110 provides image search services. The example environment 100 includes a network 102, for example, a local area network (LAN), wide area network (WAN), the Internet, or a combination of them, that connects websites 104, user devices 106, and the image search system 110. The environment 100 can include many websites 104 and user devices 106.
[0012] A website 104 is one or more resources 105 associated with a domain name and hosted by one or more servers. An example website is a collection of web pages formatted in hypertext markup language (HTML) that can contain text, images, multimedia content, and programming elements, for example, scripts. Each website 104 is maintained by a publisher, for example, an entity that manages and/or owns the website.
[0013] A resource is any data that can be provided by a website 104 over the network 102 and that is associated with a resource address. Resources include HTML pages, word processing documents, portable document format (PDF) documents, images, video, and feed sources, to name a few. Resources can include content, for example, words, phrases, images, and sounds. Resources can also include embedded information (for example, meta information and hyperlinks) and/or embedded instructions (for example, JavaScript scripts).
[0014] A user device 106 is an electronic device that is under the control of a user and is able to request and receive resources over the network 102. Examples of user devices 106 include personal computers, mobile communication devices, and other devices that can send and receive data over network 102. A user device 106 typically includes a user application, for example, a web browser, to facilitate sending and receiving data over network 102.
[0015] To facilitate searching of resources, the search engine identifies resources by crawling and indexing the resources provided by the websites 104. Data about the resources can be indexed based on the resource to which the data corresponds. The indexed and, optionally, cached copies of the resources are stored in an indexed cache 112.
[0016] User devices 106 submit image search queries 114, and optionally other search queries, to the image search system 110. The image search system 110 can be a separate search system, or part of a larger search system that also searches other types of resources. In response to receiving an image search query 114 from a user device 106, the image search system 110 uses its search engine 116 to access the indexed cache 112 and identify image resources that are relevant to the search query. Image resources can include images, video, and other visual content. Search engine 116 generates image search results that identify the image resources. An image search result is data generated by search engine 116 that identifies an image resource that is responsive to a particular search query, and includes a link to the image resource, or to the website that contains the image resource. An example image search result can include, for example, a thumbnail of the image resource, a text snippet, the URL of the web page, and a link to the web page or image resource.
[0017] Search engine 116 can rank the image search results based on scores related to the resources identified by the search results, as well as on relevance feedback scores. Scores related to resources include, for example, information retrieval ("IR") scores and, optionally, a quality score for each resource relative to other resources. In some implementations, the IR scores are computed from dot products of feature vectors corresponding to a search query 114 and a resource, and the ranking of the search results is based on scores that are a combination of the IR scores and the quality scores. For image search results, a relevance feedback score can be used in combination with the IR score and the quality score when generating the scores used to rank the image search results. An example of a relevance feedback score is a score derived from a percentage of clicks on an image in a search result.
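As an illustration only (not part of the specification), the score combination described above can be sketched as follows; the feature vectors, quality and click values, and weights are all hypothetical assumptions:

```python
# Illustrative sketch: an image result's ranking score combines an IR
# score (here, a dot product of feature vectors), a quality score, and
# a relevance-feedback (click) score. All values and weights are
# hypothetical, chosen only to make the sketch concrete.

def ir_score(query_vec, resource_vec):
    """IR score as a dot product of query and resource feature vectors."""
    return sum(q * r for q, r in zip(query_vec, resource_vec))

def combined_score(query_vec, resource_vec, quality, click_fraction,
                   w_ir=0.5, w_quality=0.3, w_click=0.2):
    """Weighted combination of the three signals; the weights are assumptions."""
    return (w_ir * ir_score(query_vec, resource_vec)
            + w_quality * quality
            + w_click * click_fraction)

score = combined_score([1.0, 0.0, 2.0], [0.5, 1.0, 0.25],
                       quality=0.8, click_fraction=0.4)
print(score)  # roughly 0.82
```

Results would then be presented in descending order of this combined score.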
[0018] Instruction engine 118 sends instructions 122 to a user device 106 in response to receiving an image search query 114 from the user device. Instructions 122 cause user device 106 to display a user interface that includes image search results responsive to the query 114 from user device 106, and to present one or more multi-language search options. As used in this document, a "multi-language search option" is a translation of the query 114 into another language together with a preview of the search results responsive to the translation when the translation is used as an image search query. Multi-language search options are described in more detail below with reference to Figures 2A and 2B.
[0019] The translations of the query can be exact translations or approximate translations. An exact translation of query terms in a first language is the corresponding term(s) in a second language that are the definitive equivalents of the terms in the first language. An approximate translation of query terms in a first language is the corresponding term(s) in a second language that are semantically similar to the query terms in the first language, but are not otherwise the definitive equivalents of the terms in the first language. As will be described below, the semantic similarity of terms in different languages can be derived from search results, selections, and other similar signals. The translated queries are generated by the translation engine 120.
[0020] User device 106 receives instructions 122, for example, in the form of one or more web pages, and presents the user interface to users. In response to a user selecting a link in a search result on a user device 106, the user device 106 requests the resource identified by the link. The website 104 hosting the resource receives the request from the user device 106 and provides the resource to the requesting user device 106.
[0021] Image search queries 114, and other search queries, submitted during user sessions can be stored in a data store such as the historical data store 124. Selection data specifying user actions taken after search results are provided are also stored in a data store such as the historical data store 124. These actions can include whether a search result was selected. The data stored in the historical data store 124 can be used to map search queries submitted during search sessions to the resources identified in the search results and to the actions taken by users.
§ 2.0 Example Image Search Interfaces
[0022] Figure 2A illustrates an example search user interface 200 that presents multi-language search options along with search results responsive to a query. The text of user interface 200 is shown in English for convenience; however, the user interface text can alternatively be in the language of the user submitting the query.
[0023] The image search user interface 200 presents three image search results 204, 206, and 208, responsive to the user query "外國人" 202, which means "alien" in Chinese. In some cases, the user is satisfied with the image search results 204, 206, and 208. However, the user may also be disappointed with the search results 204, 206, and 208, for example, because the user wants more search results, or because the user wants different search results. For example, if the user was really interested in image search results relevant to the movie "E.T.: The Extra-Terrestrial," the image search results 204, 206, and 208 will not be what the user is interested in.
[0024] If a user is fluent in other languages, the user can resubmit the query in a different language. However, users are often unfamiliar with other languages. To assist users in selecting appropriate search queries in other languages, user interface 200 presents several multi-language search options 210. Each multi-language search option 210 corresponds to a translation of query 202. For example, multi-language search option 212 corresponds to the English term "E.T.". Some of the multi-language search options 210 are in the same language as others, and others are in different languages. In various other implementations, the multi-language search options 210 may all be in the same language, or may all be in different languages.
[0025] Each multi-language search option 210 includes an identification of the translation, and a preview of image search results responsive to the translation when the translation is used as an image search query. The preview can be, for example, an image from one of the image search results responsive to the translation. The preview can alternatively or additionally be other information that describes the image search results responsive to the translation. For example, the multi-language search option 212 includes the approximate translation 214 "E.T.," and an image 216 from an image search result responsive to the translation. The multi-language search option 212 also includes an identification of the language of the translation, "English" 218, and the estimated, or actual, number of image search results responsive to the translation.
[0026] A user can select one of the multi-language search options 210, for example, by clicking the search option with a mouse or keyboard, or by selecting the search option with another input device. In response to the selection, a new user interface is presented to the user that includes search results responsive to the translation corresponding to the selected multi-language search option. The multi-language search options 210 therefore allow a user to expand his or her search to other languages. The preview image provides the user with a visual representation of the search results responsive to the translation. This allows the user to understand what types of images will be presented for a translation, even if the user is not familiar with the language of the translation.
[0027] Although the multi-language search options 210 are shown below the search results 204, 206, and 208 responsive to query 202, the multi-language search options 210 may appear in other locations in user interface 200, including, but not limited to, above the search results, to the left of the search results, to the right of the search results, or mixed with the search results.
[0028] Figure 2B illustrates an example image search interface 250 that presents image search results 252 responsive to the "alien" search query. User interface 250 is presented to a user after the user selects the multi-language search option 222 that corresponds to the "alien" translation in Figure 2A.
[0029] User interface 250 includes image search results 252 responsive to a query for "alien." These image search results 252 include the image search result 254. The image of image search result 254 was the preview image for the multi-language search option 222 shown in Figure 2A. The image search results 252 are different from the image search results 204, 206, and 208 shown in Figure 2A. The search results 252 are different because they were identified by a search engine as being responsive to the term "alien," as opposed to "外國人". Therefore, different image search results were provided to the user through multi-language search option 222.
[0030] In addition to the image search results 252, user interface 250 also includes an identification 256 of the translation, "alien," and an identification of the user's original query 258, "外國人". The user can select the original query 258. If the user does so, the user is returned to the user interface 200 shown in Figure 2A.
[0031] User interface 250 also includes multi-language search options 260. Multi-language search options 260 are multi-language search options for the "alien" translation. However, in other implementations, multi-language search options for the original query "外國人" can alternatively or additionally be shown.
§ 3.0 Example Image Search System Architecture
[0032] Figure 3 is a block diagram illustrating an example architecture of the image search system 110. The image search system 110 receives an image search query 114 from a user device 106 over a network 102, and in response to search query 114 sends instructions 122 to user device 106. The instructions cause user device 106 to display the user interface described above with reference to Figure 2A. The image search system 110 can also receive selections 302 from user device 106. In response to these selections 302, the image search system 110 sends additional instructions 122 to user device 106 to cause the user device to present additional user interfaces, for example, the user interfaces described above with reference to Figures 2A and 2B.
[0033] The image search system 110 includes a translation engine 120, a search engine 116, and an instruction engine 118.
[0034] The translation engine 120 receives the image search query 114 and generates one or more translations 304 of the image search query 114. Each translation 304 is in a language different from the language of the image search query 114; however, multiple translations 304 can be in the same language. The translations 304 are provided to search engine 116 and instruction engine 118. The translation engine is described in more detail below, with reference to Figure 4.
[0035] Search engine 116 receives the image search query 114 and the translations 304. Search engine 116 identifies image search results 306 responsive to the image search query 114, and image search results 308 responsive to each of the translations 304, for example, as described above with reference to Figure 1. In some implementations, search engine 116 ranks search results differently, or searches a different resource index, based on a language associated with the user who submitted the image search query 114. The language can be associated with the user, for example, according to one or more preferences specified by the user, or according to the user's geographic location. Alternatively, search engine 116 can match the language of each translation with the user when it identifies image search results responsive to the translation.
[0036] Instruction engine 118 receives the translations 304, the search results 306 for the image search query, and the search results 308 for the translations, and generates instructions to send to user device 106. Instruction engine 118 can generate instructions that cause the user device to display multiple user interfaces.
[0037] For example, instruction engine 118 can generate instructions that, when executed by user device 106, cause user device 106 to present a user interface that includes the search results 306 responsive to the image search query 114, as well as a multi-language search option corresponding to each of the translations 304. An example of this user interface is described above, with reference to Figure 2A. The multi-language search option for a translation includes the translation and a preview of the image search results 308 responsive to the translation. In some implementations, the preview is an image corresponding to one of the image search results 308 for the translation. The instruction engine 118 can select the image, for example, by selecting the image corresponding to the highest-ranked image search result, by selecting the image corresponding to an image search result chosen at random from a number of top search results, or by selecting the image whose corresponding image search result is most frequently selected by users, for example, as indicated by historical data such as the data in the historical data store 124 described above with reference to Figure 1.
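The three preview-selection strategies named above can be sketched as follows; this is an illustration only, and the data layout, field names, and strategy labels are assumptions, not details from the specification:

```python
# Illustrative sketch of choosing a preview image for a multi-language
# search option. Each result is represented here as a dict with an
# 'image', a 'rank', and a 'selection_count'; this shape is assumed.

import random

def preview_image(results, strategy="top_ranked", rng=None):
    """Pick a preview image by one of three hypothetical strategies."""
    if strategy == "top_ranked":
        # Image of the highest-ranked result (rank 1 is best).
        chosen = min(results, key=lambda r: r["rank"])
    elif strategy == "most_selected":
        # Image of the result users select most often.
        chosen = max(results, key=lambda r: r["selection_count"])
    else:
        # A result chosen at random from the given results.
        chosen = (rng or random).choice(results)
    return chosen["image"]

results = [
    {"image": "et1.jpg", "rank": 1, "selection_count": 40},
    {"image": "et2.jpg", "rank": 2, "selection_count": 90},
]
print(preview_image(results))                   # top-ranked -> et1.jpg
print(preview_image(results, "most_selected"))  # most clicked -> et2.jpg
```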
[0038] In response to a user selecting one of the multi-language search options, instruction engine 118 generates instructions that, when executed by user device 106, cause user device 106 to present a user interface that includes the search results 308 responsive to the translation that corresponds to the selected multi-language search option. An example of this user interface is described above, with reference to Figure 2B.
[0039] When the user device presents the user interface that includes the search results 308 responsive to the translation corresponding to the selected multi-language search option, the user can select the original query in the user interface. In response to this selection, instruction engine 118 generates instructions that, when executed by user device 106, cause user device 106 to once again present a user interface that includes the search results 306 responsive to the search query 114, as well as the multi-language search options corresponding to the translations 304.
§ 3.1 Example Translation Engine
[0040] Figure 4 is a block diagram illustrating an example of the translation engine 120. Translation engine 120 receives an image search query 114 and generates one or more translations 304 of the image search query. The example translation engine 120 includes a translation generator 402, a translation scorer 404, and a translation selector 406.
[0041] The translation generator 402 processes the image search query 114 to generate one or more candidate translations 408. As described above, candidate translations 408 can include exact translations and approximate translations. The translation generator can generate the exact translations 408, for example, using conventional techniques, such as retrieving translations for the query terms from a dictionary that associates terms with their translations, or using various machine translation techniques. The system can generate approximate translations in several ways. For example, approximate translations can be retrieved from a database that associates words with their approximate translations. Approximate translations can also be derived from search results, selections, and other similar signals. For example, if two queries have similar responsive search results, or if users select similar search results responsive to the two queries, then the terms of one query can be associated with the terms of the other query as an approximate translation, and vice versa. Search results can be determined to be similar if there is a statistically significant overlap between the groups of search results. Search results selected by users can be determined to be similar if there is a statistically significant overlap between the groups of selected search results.
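One way to make the result-overlap test above concrete is a set-overlap (Jaccard) comparison of the two queries' responsive results. This is an illustration only; the Jaccard measure and the 0.5 threshold are assumptions, since the specification requires only "statistically significant overlap":

```python
# Illustrative sketch: treat two queries as approximate translations of
# each other when their responsive-result sets overlap significantly.
# The Jaccard measure and 0.5 threshold are hypothetical choices.

def jaccard(a, b):
    """Overlap between two collections of result identifiers."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def are_approximate_translations(results_q1, results_q2, threshold=0.5):
    """True when the two queries' result sets overlap enough."""
    return jaccard(results_q1, results_q2) >= threshold

r1 = ["img1", "img2", "img3", "img4"]
r2 = ["img2", "img3", "img4", "img5"]
print(are_approximate_translations(r1, r2))  # 3/5 overlap -> True
```

The same comparison can be applied to the groups of search results users selected, rather than the full result sets.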
[0042] Translation scorer 404 receives the candidate translations 408 and generates a score for each of the candidate translations. The score can be based on one or more factors that include, but are not limited to, measurements of the candidate translation 408 as a query, measurements of the search results responsive to the candidate translation 408 as a query, and language interrelationship measurements.
[0043] Measurements of the candidate translation 408 as a query include, but are not limited to, a percentage of clicks for the candidate translation 408, a unique users measurement for the candidate translation 408, and a submission frequency measurement for the candidate translation 408. The percentage of clicks for the candidate translation 408 is the number of times users select one or more search results in response to the query divided by the total number of times users submit the query.
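The percentage of clicks is a simple ratio of two counts. As an illustration only (the counts below are hypothetical):

```python
# Illustrative sketch of the percentage-of-clicks measurement: times
# users selected at least one result for the query, divided by times
# the query was submitted. The counts are hypothetical.

def click_percentage(num_result_selections, num_query_submissions):
    """Fraction of submissions of a query that led to a result selection."""
    if num_query_submissions == 0:
        return 0.0  # avoid dividing by zero for never-submitted queries
    return num_result_selections / num_query_submissions

print(click_percentage(120, 400))  # 0.3
```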
[0044] The unique users measurement estimates the number of unique users who submit the candidate translation as a query and/or the number of unique user devices used to submit the query. The translation scorer 404 can estimate the number of unique users who submitted the candidate translation using various techniques, for example, by counting the number of unique Internet Protocol (IP) addresses from which the query was submitted, or by counting the number of cookies assigned by a search engine to users who submitted the query. In some implementations, the data used to identify the number of unique users is anonymized to protect privacy. For example, each user can be identified by the Internet Protocol (IP) address of a corresponding user device, or can be identified by a unique random number that is associated with the user device's IP address. Cookie data can be anonymized in a similar way. Therefore, the user data is not associated with and does not identify a particular user. Other anonymization processes, such as hashing, encryption, and obfuscation techniques, can also be used to ensure that user privacy is protected.
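The counting-with-anonymization idea above can be sketched as follows. This is an illustration only: the specification requires that the data not identify a particular user, and the choice of a SHA-256 hash here is an assumption (in practice a keyed or salted scheme would typically be used):

```python
# Illustrative sketch: estimate unique users by counting distinct
# submitter IP addresses after replacing each with a one-way hash,
# so no raw IP is stored. The hash choice is an assumption.

import hashlib

def anonymize(ip_address):
    """Replace an IP address with a stable one-way hash."""
    return hashlib.sha256(ip_address.encode("utf-8")).hexdigest()

def unique_user_count(submission_ips):
    """Estimate unique users as the number of distinct anonymized IPs."""
    return len({anonymize(ip) for ip in submission_ips})

ips = ["10.0.0.1", "10.0.0.2", "10.0.0.1", "10.0.0.3"]
print(unique_user_count(ips))  # 3 distinct submitters
```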
[0045] The submission frequency measurement represents how often the candidate translation 408 is received from users as a search query, or alternatively as an image search query. The submission frequency measurement can be determined, for example, by analyzing data in the historical data store 124. For example, if the historical data store records which search queries, or image search queries, users have submitted, the submission frequency measurement can be determined by counting the number of times the candidate translation 408 occurs as a query in the data. In some implementations, the submission frequency measurement is determined by dividing that count by the total number of queries that users have submitted.
[0046] In some implementations, the submission frequency measurement has one value when the query is a long-tail query, and a different value when the query is not a long-tail query. The translation scorer can determine whether a query is a long-tail query according to various metrics. For example, in some implementations, a query is a long-tail query when it is submitted with a frequency that is less than a threshold. In other implementations, a query is a long-tail query when the unique users measurement for the query falls below a threshold. In still other implementations, a query is a long-tail query when the temporal or geographical standard deviation of the number of times the query is submitted is below a threshold, for example, when the query was submitted primarily over a short period of time or by users in a small geographic region.
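The three long-tail tests above can be sketched as a single check; every threshold below is an illustrative assumption, and the temporal spread is represented here by per-day submission counts:

```python
# Illustrative sketch of long-tail detection. A query is treated as
# long-tail if it is rarely submitted, has few unique users, or its
# submission counts show little spread over time. All thresholds are
# hypothetical.

import statistics

def is_long_tail(submission_count, unique_users, daily_counts,
                 freq_threshold=100, user_threshold=50, stdev_threshold=5.0):
    if submission_count < freq_threshold:
        return True  # submitted too rarely
    if unique_users < user_threshold:
        return True  # too few distinct users
    if len(daily_counts) > 1 and statistics.stdev(daily_counts) < stdev_threshold:
        return True  # low temporal spread of submissions
    return False

# A rarely submitted query counts as long-tail.
print(is_long_tail(submission_count=12, unique_users=10, daily_counts=[1, 2, 3]))
```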
[0047] Measurements of the search results responsive to the candidate translation 408 as a query include, but are not limited to, a result quantity measurement and/or a visual similarity measurement. The result quantity measurement is the quantity of search results responsive to the candidate translation 408 when the candidate translation 408 is used as an image search query. The visual similarity measurement is an estimate of the visual similarity between a group of search results responsive to the image search query 114 and a group of search results responsive to the candidate translation 408 when the candidate translation is used as an image search query.
[0048] Language interrelationship measurements estimate the interrelationship between the language of the image search query 114 and the language of the candidate translation 408. For example, a language interrelationship measurement may be a revision frequency measurement. The revision frequency measurement measures how often search queries issued in the language of the image search query 114 are revised into search queries in the language of the corresponding candidate translation 408. A given search query in the language of the image search query 114 is considered to have been revised into a search query in the language of the corresponding candidate translation 408 when a user submits the given search query and then submits the corresponding search query. The corresponding search query is an exact translation, or approximate translation, of the given search query. For example, if the language of the image search query 114 is English and the language of the candidate translation 408 is French, and a user submits a search for "Eiffel Tower" and then submits a search for "De Toren Van Eiffel," the translation scorer 404 can consider "De Toren Van Eiffel" to be a corresponding search query, because it is a French translation of the English expression "Eiffel Tower".
[0049] Translation scorer 404 can determine the revision frequency measurement for the languages of the image search query 114 and a candidate translation 408, for example, from an analysis of the data in the historical data store 124. The corresponding search query can be an exact translation, for example, as determined according to conventional translation techniques. The corresponding search query may alternatively be a rough translation, for example, a translation that differs in the use of slang or in the use of articles such as "a" or "the". The corresponding search query can be submitted immediately following the given search query, or it can be submitted within a predetermined window, such as a predetermined number of queries or a predetermined amount of time. The window can be determined empirically, for example.
[0050] In general, a high revision frequency measurement indicates search overlap between the language of the image search query 114 and that of candidate translation 408, and is therefore an indication that candidate translation 408 should be presented to a user.
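A minimal sketch of the revision frequency measurement over a per-user query log, assuming a lookup table of corresponding (exact or rough) translations and a window expressed as a number of subsequent queries:

```python
def revision_frequency(user_logs, translations, window=3):
    """Fraction of source-language queries that a user revises into a
    corresponding translated query within `window` subsequent queries.

    user_logs: dict mapping user id -> chronological list of queries.
    translations: dict mapping a source query -> set of corresponding
    (exact or rough) translations in the candidate's language.
    """
    source_count = 0
    revised_count = 0
    for queries in user_logs.values():
        for i, q in enumerate(queries):
            if q not in translations:
                continue
            source_count += 1
            # Look only at the next `window` queries by the same user.
            following = queries[i + 1 : i + 1 + window]
            if any(f in translations[q] for f in following):
                revised_count += 1
    return revised_count / source_count if source_count else 0.0
```

The query strings and the window size here are illustrative; the specification only says the window can be determined empirically.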
[0051] Once translation scorer 404 identifies the factors, it uses the factors to generate a score, for example, by taking a weighted sum, or weighted average, of the factors. The weights for a weighted sum or average can be determined, for example, empirically. In some implementations, translation scorer 404 normalizes the factors before they are combined.
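The weighted combination can be sketched as follows. The min-max normalization and the particular weights are illustrative assumptions, since the text only says the weights are determined empirically:

```python
def normalize(values):
    """Min-max normalize a list of raw factor values into [0, 1]."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def score_candidates(factor_values, weights):
    """Weighted sum of normalized factors.

    factor_values: dict factor name -> list of raw values, one per candidate.
    weights: dict factor name -> empirically chosen weight.
    Returns one score per candidate translation.
    """
    n = len(next(iter(factor_values.values())))
    scores = [0.0] * n
    for name, values in factor_values.items():
        for i, v in enumerate(normalize(values)):
            scores[i] += weights[name] * v
    return scores
```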
[0052] The translation selector 406 receives candidate translations 408 and their scores 410 from translation scorer 404, and selects one or more of the candidate translations 408. The selected translations are then provided as translations 304 of image search query 114. The translation selector 406 selects translations 304 from candidate translations 408 according to their scores. For example, translation selector 406 can select a number of the highest-scoring candidate translations 408, ranked in order of score, or select all candidate translations 408 whose score satisfies a threshold. The number and the threshold can be determined, for example, empirically.
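Both selection strategies (a fixed number of top-ranked candidates, or all candidates whose score meets a threshold) fit in a few lines; exactly one of `top_n` or `threshold` should be supplied:

```python
def select_translations(candidates, scores, top_n=None, threshold=None):
    """Select candidate translations either as the top_n highest-scored,
    or as all candidates whose score satisfies `threshold`.
    Exactly one of top_n / threshold should be given."""
    ranked = sorted(zip(candidates, scores), key=lambda cs: cs[1], reverse=True)
    if top_n is not None:
        return [c for c, _ in ranked[:top_n]]
    return [c for c, s in ranked if s >= threshold]
```

The candidate strings below are illustrative only.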
[0053] While the above description focuses on a translation engine in which candidate translations 408 are generated and translations 304 are then selected from candidate translations 408, in other implementations translation engine 120 may use different selection methods, or may alternatively select all of the translations that are generated.

§ 4.0 Example of Image Search User Interface Generation Process
[0054] Figure 5 is a flowchart of an example image search user interface generation process 500. Process 500 is used to generate instructions that cause a client device to present a user interface that includes image search results responsive to a query, and multilingual search options that match the query. Process 500 can be implemented, for example, by the image search system 110.
[0055] Process 500 receives an image search query and image search results responsive to the image search query (502). The image search query can be received from a user device, for example, as described above with reference to Figure 3. The image search results can be received, for example, from a search engine, as described above with reference to Figure 3. Process 500 then obtains one or more translations of the image search query (504), for example, as described above with reference to Figures 3 and 4. Process 500 then receives image search results responsive to each of the translations of the image search query, when each translation is used as an image search query (506), for example, as described above with reference to Figure 3. The system then provides instructions to a client device (508). When the instructions are executed by the client device, they cause the client device to present a user interface that includes image search results responsive to the image search query, and a respective multilingual search option for each of the translations. Each multilingual search option is selectable from the user interface. This step is described in more detail above with reference to Figure 3.
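The four steps (502) through (508) can be sketched as a small pipeline; `search_images` and `translate` stand in for the search engine and translation engine, whose interfaces are assumptions rather than anything specified here:

```python
def generate_search_ui(query, search_images, translate):
    """Sketch of process 500. `search_images` maps a query string to a
    list of image results; `translate` maps a query to its selected
    translations. Both interfaces are illustrative assumptions."""
    results = search_images(query)                                 # step (502)
    translations = translate(query)                                # step (504)
    by_translation = {t: search_images(t) for t in translations}   # step (506)
    # Step (508): one multilingual search option per translation, each
    # carrying the translation and a one-image preview of its results.
    options = [
        {"translation": t, "preview": res[0] if res else None}
        for t, res in by_translation.items()
    ]
    return {"results": results, "options": options}
```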
[0056] In some implementations, process 500 additionally provides further instructions to the client device in response to a selection of one of the multilingual search options. These instructions cause the client device to present a second user interface that includes the translation of the selected multilingual search option, and image search results that are responsive to the translation. An example of this user interface is described above with reference to Figure 2B. Process 500 may additionally provide further instructions to the client device in response to a selection of the original image search query on the second user interface. These instructions cause the client device to present a user interface that includes image search results responsive to the original image search query.
[0057] Embodiments of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, that is, one or more modules of computer program instructions encoded on a computer storage medium for execution by, or to control the operation of, a data processing apparatus. Alternatively or additionally, the program instructions can be encoded in a propagated signal that is an artificially generated signal, for example, a machine-generated electrical, optical, or electromagnetic signal, which is generated to encode information for transmission to suitable receiver apparatus for execution by a data processing apparatus. The computer storage medium can be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
[0058] The term "data processing apparatus" encompasses all kinds of apparatus, devices, and machines for processing data, including, for example, a programmable processor, a computer, or multiple processors or computers. The apparatus can include special purpose logic circuitry, for example, an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
[0059] A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (for example, one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (for example, files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[0060] The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, for example, an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
[0061] Processors suitable for the execution of a computer program include, for example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, for example, magnetic disks, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, for example, a mobile phone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (for example, a universal serial bus (USB) flash drive), to name just a few.
[0062] Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including, for example, semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, for example, internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[0063] To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer that has a display device, for example, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user, and a keyboard and a pointing device, for example, a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, for example, visual, auditory, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. Additionally, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
[0064] Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, for example, as a data server, or that includes a middleware component, for example, an application server, or that includes a front-end component, for example, a client computer that has a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, for example, a communication network. Examples of communication networks include a local area network ("LAN") and a wide area network ("WAN"), for example, the Internet.
[0065] The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship with each other.
[0066] While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features of a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.
[0067] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all depicted operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing can be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
[0068] Particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying Figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing can be advantageous. As another example, although the description above focuses on image search, similar techniques can be used to present multilingual search options in response to other types of queries.
Claims (10)
[0001]
Computer-implemented method, comprising: receiving (502), on a data processing apparatus, a first image search query and first image search results that are responsive to the first image search query, where the first image search query is one or more terms in a first language; obtaining (504), by the data processing apparatus, translations of the first image search query, where each translation is a translation of the first image search query into a respective second language other than the first language and where at least two of the translations of the first image search query are translations of the first image search query into the same second language as each other; characterized by the fact that obtaining, by the data processing apparatus, one or more selected translations for the first image search query comprises: receiving a plurality of candidate translations of the first image search query; determining a score for each candidate translation; and selecting the translations from among the candidate translations according to the scores, where selecting the translations from the candidate translations according to the scores comprises selecting a number of the highest-scoring candidate translations ranked in order of score; receiving (506), in the data processing apparatus, for each translation of the first image search query, respective image search results that are determined to be responsive to the translation of the first image search query when the translation is used as an image search query; providing (508) first instructions (122) to a client device (106) which, when executed by the client device, cause the client device to present a user interface (200) that includes: one or more of the first image search results (204, 206, 208) responsive to the first image search query (202); and a respective multilingual search option (210, 212, 222) for each of the translations of the first image search query, where the respective multilingual search option (212) for each translation includes the translation (214) and a preview of the respective image search results responsive to the translation, where the preview of the respective image search results responsive to a respective translation of the image search query is an image (216) associated with one of the respective image search results; where each multilingual search option is selectable in the user interface to cause the data processing apparatus to provide second instructions to the client device which, when executed by the client device, cause the client device to present, in the user interface, results of an image search for the associated translation of the first image search query; and in response to a selection of a first multilingual search option, providing, via the data processing apparatus, second instructions to the client device (106) which, when executed by the client device, cause the client device to present a user interface (250) including a first translation corresponding to the first multilingual search option and the respective image search results (252) that are responsive to the first translation.
[0002]
Method, according to claim 1, characterized by the fact that: the second instructions additionally include instructions that, when executed by the client device, cause the client device to present the first image search query; the first image search query is selectable from the user interface; and, in response to a selection of the first image search query, third instructions are provided to the client device that, when executed by the client device, cause the client device to present a user interface that includes the first image search results.
[0003]
Method, according to claim 1, characterized by the fact that determining a score for a candidate translation comprises determining the score for the candidate translation from a submission frequency measurement that measures how often the candidate translation is received from users as an image search query.
[0004]
Method, according to claim 5, characterized by the fact that determining a score for a candidate translation comprises determining the score for the candidate translation from a revision frequency measurement that measures how often search queries in the first language are revised by users into corresponding search queries in the respective second language of the candidate translation.
[0005]
Method, according to claim 1, characterized by the fact that determining a score for a candidate translation comprises determining the score for the candidate translation from a percentage of clicks for the candidate translation when the candidate translation is submitted as a search query, where the percentage of clicks measures how often users select search results responsive to the candidate translation.
[0006]
Method, according to claim 1, characterized by the fact that determining a score for a candidate translation comprises determining the score for the candidate translation from a measurement of unique users that estimates the number of unique users who submitted the candidate translation as a search query.
[0007]
Method, according to claim 1, characterized by the fact that determining a score for a candidate translation comprises determining the score for the candidate translation from a result-quantity measurement that measures a number of image search results responsive to the candidate translation when the candidate translation is submitted as an image search query.
[0008]
System, comprising: a processor; and a computer storage medium coupled to the processor and including instructions which, when executed by the processor, cause the processor to perform operations comprising, in sequence: receiving (502) a first image search query and first image search results that are responsive to the first image search query, where the first image search query is one or more terms in a first language; obtaining (504) translations of the first image search query, where each translation is a translation of the first image search query into a respective second language other than the first language and where at least two of the translations of the first image search query are translations of the first image search query into the same second language as each other, characterized by the fact that obtaining (504) one or more selected translations for the first image search query comprises: receiving a plurality of candidate translations of the first image search query; determining a score for each candidate translation; and selecting the translations from the candidate translations according to the scores, where selecting the translations from the candidate translations according to the scores comprises selecting a number of the highest-scoring candidate translations ranked in order of score; receiving (506), for each translation of the first image search query, respective image search results that are determined to be responsive to the translation of the first image search query when the translation is used as an image search query; providing (508) first instructions (122) to a client device (106) which, when executed by the client device, cause the client device to present a user interface (200) that includes: one or more of the first image search results (204, 206, 208) responsive to the first image search query; and a respective multilingual search option (210, 212, 222) for each of the translations of the first image search query, where the respective multilingual search option (212) for each translation includes the translation (214) and a preview of the respective translation-responsive image search results, where the preview of the respective image search results responsive to a respective translation of the image search query is an image (216) associated with one of the respective image search results, where each multilingual search option is selectable in the user interface to cause the data processing apparatus to provide second instructions to the client device which, when executed by the client device, cause the client device to present, in the user interface, results of an image search for the associated translation of the first image search query; and in response to a selection of a first multilingual search option, providing second instructions to the client device (106) which, when executed by the client device, cause the client device to present a user interface (250) that includes a first translation corresponding to the first multilingual search option and the respective image search results (252) that are responsive to the first translation.
[0009]
System, according to claim 8, characterized by the fact that: the second instructions additionally include instructions that, when executed by the client device, cause the client device to present the first image search query; the first image search query is selectable from the user interface; and, in response to a selection of the first image search query, third instructions are provided to the client device that, when executed by the client device, cause the client device to present a user interface that includes the first image search results.
[0010]
Computer storage medium encoded with instructions, where the instructions are operable to cause a data processing apparatus to perform operations comprising, in sequence: receiving (502) a first image search query and first image search results that are responsive to the first image search query, where the first image search query is one or more terms in a first language; obtaining (504) translations of the first image search query, where each translation is a translation of the first image search query into a respective second language other than the first language and where at least two of the translations of the first image search query are translations of the first image search query into the same second language as each other, characterized by the fact that obtaining (504) one or more selected translations for the first image search query comprises: receiving a plurality of candidate translations of the first image search query; determining a score for each candidate translation; and selecting the translations from among the candidate translations according to the scores, where selecting the translations from the candidate translations according to the scores comprises selecting a number of the highest-scoring candidate translations ranked in order of score; receiving (506), for each translation of the first image search query, respective image search results that are determined to be responsive to the translation of the first image search query when the translation is used as an image search query; providing (508) first instructions (122) to a client device (106) which, when executed by the client device, cause the client device to present a user interface that includes: one or more of the first image search results (204, 206, 208) responsive to the first image search query; and a respective multilingual search option (210, 212, 222) for each of the translations of the first image search query, where the respective multilingual search option (212) for each translation includes the translation and a preview of the respective image search results responsive to the translation, where the preview of the respective image search results responsive to a respective translation of the image search query is an image (216) associated with one of the respective image search results, where each multilingual search option is selectable in the user interface to cause the data processing apparatus to provide second instructions to the client device which, when executed by the client device, cause the client device to present, in the user interface, results of an image search for the associated translation of the first image search query; and in response to a selection of a first multilingual search option, providing second instructions to the client device (106) which, when executed by the client device, cause the client device to present a user interface (250) including a first translation corresponding to the first multilingual search option and the respective image search results (252) that are responsive to the first translation.
Similar technologies:
Publication number | Publication date | Patent title
BR112012012133B1|2021-01-12|computer-implemented method, system and computer storage medium for multi-language search options
US9589071B2|2017-03-07|Query suggestions from documents
US20160378858A1|2016-12-29|Clustering of search results
US8977612B1|2015-03-10|Generating a related set of documents for an initial set of documents
EP2438539B1|2018-08-08|Co-selected image classification
US9679027B1|2017-06-13|Generating related questions for search queries
US8856125B1|2014-10-07|Non-text content item search
JP2010513997A|2010-04-30|Online computer-assisted translation
US8832096B1|2014-09-09|Query-dependent image similarity
US9183499B1|2015-11-10|Evaluating quality based on neighbor features
US9218546B2|2015-12-22|Choosing image labels
US11182564B2|2021-11-23|Text recommendation method and apparatus, and electronic device
US9916384B2|2018-03-13|Related entities
US9116979B2|2015-08-25|Systems and methods for creating an interest profile for a user
US20160217181A1|2016-07-28|Annotating Query Suggestions With Descriptions
US9152701B2|2015-10-06|Query classification
US10223461B1|2019-03-05|Identifying languages relevant to resources
Masuda et al.2007|Video scene retrieval using online video annotation
US9648130B1|2017-05-09|Finding users in a social network based on document content
US20150161127A1|2015-06-11|Ranking entity realizations for information retrieval
US20210073203A1|2021-03-11|Method of and system for identifying abnormal rating activity
WO2021053391A1|2021-03-25|Multilingual search queries and results
Patent family:
Publication number | Publication date
CA2781321C|2017-07-11|
TW201137650A|2011-11-01|
EP2502161A4|2014-01-22|
CA2781321A1|2011-05-26|
CN102770859A|2012-11-07|
US8856162B2|2014-10-07|
TWI480747B|2015-04-11|
EP2502161B1|2018-09-26|
CN102770859B|2017-05-03|
EP2502161A1|2012-09-26|
US20150019582A1|2015-01-15|
WO2011060565A1|2011-05-26|
KR101689314B1|2016-12-23|
KR20120135188A|2012-12-12|
BR112012012133A2|2018-03-13|
US9177018B2|2015-11-03|
US20120233196A1|2012-09-13|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title

JPH11120185A|1997-10-09|1999-04-30|Canon Inc|Information processor and method therefor|
GB2337611A|1998-05-20|1999-11-24|Sharp Kk|Multilingual document retrieval system|
US7516154B2|2000-06-28|2009-04-07|Qnaturally Systems Inc.|Cross language advertising|
CN1492354A|2000-06-02|2004-04-28|钧 顾|Multilingual information searching method and multilingual information search engine system|
TWI221234B|2001-10-23|2004-09-21|Yeun-Jonq Lee|An improved method of data inquiry system|
WO2004049196A2|2002-11-22|2004-06-10|Transclick, Inc.|System and method for speech translation using remote devices|
KR20040059240A|2002-12-28|2004-07-05|엔에이치엔|A method for providing multi-language translation service and a system of enabling the method|
US7634472B2|2003-12-01|2009-12-15|Yahoo! Inc.|Click-through re-ranking of images and other data|
CA2630683C|2005-11-23|2014-10-28|Anthony Scriffignano|System and method for searching and matching data having ideogrammatic content|
US7644373B2|2006-01-23|2010-01-05|Microsoft Corporation|User interface for viewing clusters of images|
US7853555B2|2006-04-19|2010-12-14|Raytheon Company|Enhancing multilingual data querying|
US20090024599A1|2007-07-19|2009-01-22|Giovanni Tata|Method for multi-lingual search and data mining|
US20090083243A1|2007-09-21|2009-03-26|Google Inc.|Cross-language search|
WO2011060565A1|2009-11-20|2011-05-26|Google Inc.|Cross-language search options|WO2011060565A1|2009-11-20|2011-05-26|Google Inc.|Cross-language search options|
US8375025B1|2010-12-30|2013-02-12|Google Inc.|Language-specific search results|
JP5876141B2|2011-04-28|2016-03-02|マイクロソフト テクノロジー ライセンシング,エルエルシー|Alternative market search results toggle|
US8538946B1|2011-10-28|2013-09-17|Google Inc.|Creating model or list to identify queries|
US9323746B2|2011-12-06|2016-04-26|At&T Intellectual Property I, L.P.|System and method for collaborative language translation|
US9684653B1|2012-03-06|2017-06-20|Amazon Technologies, Inc.|Foreign language translation using product information|
JP2014056503A|2012-09-13|2014-03-27|International Business Maschines Corporation|Computer packaging method, program, and system for specifying non-text element matching communication in multilingual environment|
US20140164422A1|2012-12-07|2014-06-12|Verizon Argentina SRL|Relational approach to systems based on a request and response model|
US9183261B2|2012-12-28|2015-11-10|Shutterstock, Inc.|Lexicon based systems and methods for intelligent media search|
US9183215B2|2012-12-29|2015-11-10|Shutterstock, Inc.|Mosaic display systems and methods for intelligent media search|
US9098552B2|2013-02-05|2015-08-04|Google Inc.|Scoring images related to entities|
US9760803B2|2013-05-15|2017-09-12|Google Inc.|Associating classifications with images|
US8819006B1|2013-12-31|2014-08-26|Google Inc.|Rich content for query answers|
US20150199412A1|2014-01-10|2015-07-16|Htc Corporation|Mobile communications device, non-transitory computer-readable medium and method for displaying a search result cover page and switching from the search result cover page to a search result page|
KR101480837B1|2014-10-27|2015-01-13|국방과학연구소|Metheo for extracting and connecting of insentient object between language-cross based on link structure|
US10452786B2|2014-12-29|2019-10-22|Paypal, Inc.|Use of statistical flow data for machine translations between different languages|
EP3241382B1|2014-12-30|2020-06-24|LG Electronics Inc.|Method and apparatus for transmitting buffer status report for bi-directional transmission in wireless communication system|
CN108255917B|2017-09-15|2020-12-18|阿里巴巴(中国)有限公司|Image management method and device and electronic device|
KR102353687B1|2021-04-14|2022-01-21|주식회사 퍼넥티브|Server for providing service for educating english and method for operation thereof|
Legal status:
2018-06-12| B25D| Requested change of name of applicant approved|Owner name: GOOGLE LLC (US) |
2019-01-22| B06F| Objections, documents and/or translations needed after an examination request according art. 34 industrial property law|
2019-07-16| B06T| Formal requirements before examination|
2020-02-11| B07A| Technical examination (opinion): publication of technical examination (opinion)|
2020-08-11| B06A| Notification to applicant to reply to the report for non-patentability or inadequacy of the application according art. 36 industrial patent law|
2020-11-17| B09A| Decision: intention to grant|
2021-01-12| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 10 (TEN) YEARS COUNTED FROM 12/01/2021, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Filing date | Patent title
PCT/CN2009/001287|WO2011060565A1|2009-11-20|2009-11-20|Cross-language search options|